Markov chain - definition. What is a Markov chain
What is a Markov chain - definition

A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Related terms and variants: Markov process; Markov sequence; Markov analysis; Markovian process; Markovian property; Markov predictor; Markoff chain; transition probability; absorbing state; Markov chaining; equilibrium distribution; irreducible Markov chain; homogeneous Markov chain; embedded Markov chain; positive recurrent; transition density; Markov text generators; Markov text; applications of Markov chains
  • [Image: Russian mathematician Andrey Markov]
Results found: 1090
Markov chain         
<probability> (Named after Andrei Markov) A model of a sequence of events in which the probability of each event depends only on the state reached by the preceding event. A Markov process is governed by a Markov chain. In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions. (1995-02-23)
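As a toy illustration of the definition above, a Markov chain can be simulated by drawing each next state using only the transition probabilities of the current state. The two-state weather model and its probabilities below are assumptions made for this example, not part of the source:

```python
import random

# Illustrative two-state weather chain (states and probabilities are assumed).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state given only the current state (the Markov property)."""
    targets = list(transitions[state])
    weights = [transitions[state][t] for t in targets]
    return random.choices(targets, weights=weights)[0]

state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(" -> ".join(path))
```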
Markov process         
<probability, simulation> A process in which the sequence of events can be described by a Markov chain. (1995-02-23)
Markov chain Monte Carlo         
  • [Image: Convergence of the Metropolis–Hastings algorithm. Markov chain Monte Carlo attempts to approximate the blue distribution with the orange distribution.]
A class of algorithms.
Also known as: Random walk Monte Carlo; Markov Chain Monte Carlo; MCMC methods; Markov clustering; Markov chain Monte Carlo methods
In statistics, Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a Markov chain that has the desired distribution as its equilibrium distribution, one can obtain a sample of the desired distribution by recording states from the chain.
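As a sketch, one of the best-known MCMC algorithms, Metropolis–Hastings, can be written in a few lines. The target density here is an unnormalized standard normal chosen purely for illustration; recording the chain's states yields approximate samples from the target, since the target is the chain's equilibrium distribution:

```python
import math
import random

def target(x):
    # Unnormalized N(0, 1) density; MCMC needs the density only up to a constant.
    return math.exp(-0.5 * x * x)

def metropolis_hastings(n_samples, step_size=1.0):
    x = 0.0                  # arbitrary starting state
    samples = []
    for _ in range(n_samples):
        proposal = x + random.uniform(-step_size, step_size)  # symmetric proposal
        # Accept with probability min(1, target(proposal) / target(x)).
        if random.random() < target(proposal) / target(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(10_000)
print(sum(samples) / len(samples))  # sample mean should settle near 0
```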
Additive Markov chain         
In probability theory, an additive Markov chain is a Markov chain with an additive conditional probability function. Here the process is a discrete-time Markov chain of order m and the transition probability to a state at the next time is a sum of functions, each depending on the next state and one of the m previous states.
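In symbols, the defining property can be written as follows (a sketch consistent with the definition above; the summand f is the only assumed notation):

```latex
\Pr(X_{n+1} = x \mid X_n, X_{n-1}, \dots, X_{n-m+1})
  = \sum_{r=1}^{m} f\left(x, X_{n-r+1}, r\right)
```

Each term f(x, X_{n-r+1}, r) depends only on the candidate next state x and the single state visited r steps earlier.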
Continuous-time Markov chain         
  • [Image: Transition graph with transition probabilities, shown for states 1, 5, 6 and 8. There is a bidirectional secret passage between states 2 and 8.]
A stochastic process that satisfies the Markov property (sometimes characterized as "memorylessness").
Also known as: Continuous time Markov chain; CTMC; Continuous-time Markov process
A continuous-time Markov chain (CTMC) is a continuous-time stochastic process in which, for each state, the process remains in that state for an exponentially distributed amount of time and then moves to a different state as specified by the probabilities of a stochastic matrix. An equivalent formulation describes the process as changing state according to the least value of a set of exponential random variables, one for each possible state it can move to, with the parameters determined by the current state.
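The first formulation translates directly into a simulation sketch: hold in the current state for an exponentially distributed time, then jump according to a row of the stochastic matrix. The three states, rates and jump probabilities below are illustrative assumptions:

```python
import random

rates = {"A": 1.0, "B": 0.5, "C": 2.0}   # exponential holding-time rate per state
jumps = {"A": {"B": 0.7, "C": 0.3},      # jump probabilities; each row sums to 1
         "B": {"A": 0.4, "C": 0.6},
         "C": {"A": 0.5, "B": 0.5}}

def simulate(state, t_end):
    """Return the (time, state) trajectory of the chain up to time t_end."""
    t, trajectory = 0.0, [(0.0, state)]
    while True:
        t += random.expovariate(rates[state])   # exponential holding time
        if t >= t_end:
            return trajectory
        targets = list(jumps[state])
        weights = [jumps[state][s] for s in targets]
        state = random.choices(targets, weights=weights)[0]
        trajectory.append((t, state))

print(simulate("A", 5.0))
```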
Discrete-time Markov chain         
A sequence of random variables in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past.
Also known as: DTMC; Discrete-time Markov process; Discrete time Markov chain
In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past. For instance, a machine may have two states, A and E.
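Staying with that two-state machine, here is a sketch of a DTMC over states A and E; the transition probabilities are assumptions chosen for the example. Pushing an initial distribution through repeated transitions illustrates convergence to the chain's stationary distribution:

```python
# Transition matrix as nested dicts; each row sums to 1 (values are assumed).
P = {"A": {"A": 0.6, "E": 0.4},
     "E": {"A": 0.7, "E": 0.3}}

def evolve(dist, steps):
    """Push a probability distribution over states through `steps` transitions."""
    for _ in range(steps):
        dist = {t: sum(dist[s] * P[s][t] for s in P) for t in P}
    return dist

# Starting surely in state A; after many steps the distribution is close to
# the stationary distribution (7/11, 4/11) of this particular matrix.
print(evolve({"A": 1.0, "E": 0.0}, 50))
```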
Nearly completely decomposable Markov chain         
Also known as: Nearly completely decomposable; nearly completely decomposable matrix; NCD Markov chain
In probability theory, a nearly completely decomposable (NCD) Markov chain is a Markov chain where the state-space can be partitioned in such a way that movement within a partition occurs much more frequently than movement between partitions. Particularly efficient algorithms exist to compute the stationary distribution of Markov chains with this property.
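For illustration (an assumed example, not taken from the source), a transition matrix over two weakly coupled pairs of states is nearly completely decomposable when the within-block entries dominate and the coupling \epsilon is small:

```latex
P =
\begin{pmatrix}
0.7 - \epsilon & 0.3 & \epsilon & 0 \\
0.4 & 0.6 - \epsilon & 0 & \epsilon \\
\epsilon & 0 & 0.5 - \epsilon & 0.5 \\
0 & \epsilon & 0.2 & 0.8 - \epsilon
\end{pmatrix},
\qquad 0 < \epsilon \ll 1
```

Here states {1, 2} and {3, 4} form the two partitions: the chain moves within a pair with high probability and between pairs only with probability \epsilon.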
Andrey Markov         
Russian mathematician.
Also known as: Andrei Andreevich Markov; A. A. Markov; Markov, A.A.; Andrey Andreyevich Markov; Андре́й Андре́евич Ма́рков

Andrey Andreyevich Markov (14 June 1856 – 20 July 1922) was a Russian mathematician best known for his work on stochastic processes. A primary subject of his research later became known as the Markov chain.

Markov and his younger brother Vladimir Andreevich Markov (1871–1897) proved the Markov brothers' inequality. His son, another Andrey Andreyevich Markov (1903–1979), was also a notable mathematician, making contributions to constructive mathematics and recursive function theory.

Sergei Markov         
Russian political scientist.
Also known as: Sergei A. Markov; Sergey Alexandrovich Markov
Sergei Alexandrovich Markov (born 1958) is a Russian political scientist, journalist and former close advisor to Russian President Vladimir Putin. He holds a Doctor of Political Science degree and is an assistant professor in the Public Policy department of the Faculty of Philosophy at Moscow State University, a professor in the Faculty of Political Science at the Moscow State Institute of International Relations (MGIMO-University), and director of the Institute of Political Studies.
Moisey Markov         
Russian particle physicist.
Also known as: Moisey A. Markov; Moisey Alexandrovich Markov; Moisei Markov
Moisey Alexandrovich Markov (13 May 1908, Rasskazovo, Tambov Governorate, Russian Empire – 1 November 1994, Moscow, Russia) was a Soviet theoretical physicist who worked mainly in quantum mechanics, nuclear physics and particle physics. He is particularly known for having proposed the idea of underwater neutrino telescopes in 1960.

Wikipedia

Markov chain

A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally, this may be thought of as, "What happens next depends only on the state of affairs now." A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC). It is named after the Russian mathematician Andrey Markov.
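
Formally, for the discrete-time case, the Markov property is the standard condition (stated here for clarity):

```latex
\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = \Pr(X_{n+1} = x \mid X_n = x_n)
```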

Markov chains have many applications as statistical models of real-world processes, such as studying cruise control systems in motor vehicles, queues or lines of customers arriving at an airport, currency exchange rates and animal population dynamics.

Markov processes are the basis for general stochastic simulation methods known as Markov chain Monte Carlo, which are used for simulating sampling from complex probability distributions, and have found application in Bayesian statistics, thermodynamics, statistical mechanics, physics, chemistry, economics, finance, signal processing, information theory and speech processing.

The adjectives Markovian and Markov are used to describe something that is related to a Markov process.